
    The logic and mathematics of occasion sentences

    The prime purpose of this paper is, first, to restore to discourse-bound occasion sentences their rightful central place in semantics and, secondly, taking these as the basic propositional elements in the logical analysis of language, to contribute to the development of an adequate logic of occasion sentences and a mathematical (Boolean) foundation for such a logic, thus preparing the ground for more adequate semantic, logical and mathematical foundations of the study of natural language. Some of the insights elaborated in this paper have appeared in the literature over the past thirty years, and a number of new developments have resulted from them. The present paper aims at providing an integrated conceptual basis for this new development in semantics. In Section 1 it is argued that the reduction by translation of occasion sentences to eternal sentences, as proposed by Russell and Quine, is semantically and thus logically inadequate. Natural language is a system of occasion sentences, eternal sentences being merely boundary cases. The logic has fewer tasks than is standardly assumed, as it excludes semantic calculi, which depend crucially on information supplied by cognition and context and thus belong to cognitive psychology rather than to logic. For sentences to express a proposition and thus be interpretable and informative, they must first be properly anchored in context. A proposition has a truth value when it is, moreover, properly keyed in the world, i.e. is about a situation in the world. Section 2 deals with the logical properties of natural language. It argues that presuppositional phenomena require trivalence and presents the trivalent logic PPC3, with two kinds of falsity and two negations. It introduces the notion of Σ-space for a sentence A (or /A/, the set of situations in which A is true) as the basis of logical model theory, and the notion of P/A/ (the Σ-space of the presuppositions of A), functioning as a 'private' subuniverse for /A/. The trivalent Kleene calculus is reinterpreted as a logical account of vagueness, rather than of presupposition. PPC3 and the Kleene calculus are refinements of standard bivalent logic and can be combined into one logical system. In Section 3 the adequacy of PPC3 as a truth-functional model of presupposition is considered more closely and given a Boolean foundation. In a noncompositional extended Boolean algebra, three operators are defined: 1a for the conjoined presuppositions of a, ã for the complement of a within 1a, and â for the complement of 1a within Boolean 1. The logical properties of this extended Boolean algebra are axiomatically defined and proved for all possible models. Proofs are provided of the consistency and the completeness of the system. Section 4 is a provisional exploration of the possibility of using the results obtained for a new discourse-dependent account of the logic of modalities in natural language. The overall result is a modified and refined logical and model-theoretic machinery, which takes into account both the discourse-dependency of natural language sentences and the necessity of selecting a key in the world before a truth value can be assigned.
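    A minimal sketch of how the three operators just described fit together, written in LaTeX with standard Boolean notation (an overline for complement); the equations are reconstructed from the wording of this abstract, not taken from the paper itself:

        % 1_a : the conjoined presuppositions of a (the 'private' subuniverse P/A/)
        % \tilde{a} : the complement of a within 1_a
        % \hat{a}   : the complement of 1_a within the Boolean unit 1
        \[
          \tilde{a} \;=\; 1_a \wedge \overline{a}, \qquad
          \hat{a} \;=\; \overline{1_a}, \qquad
          a \vee \tilde{a} \;=\; 1_a, \qquad
          1_a \vee \hat{a} \;=\; 1 .
        \]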

    Partiality, revisited: the partiality monad as a quotient inductive-inductive type

    Capretta's delay monad can be used to model partial computations, but it has the "wrong" notion of built-in equality, strong bisimilarity. An alternative is to quotient the delay monad by the "right" notion of equality, weak bisimilarity. However, recent work by Chapman et al. suggests that it is impossible to define a monad structure on the resulting construction in common forms of type theory without assuming (instances of) the axiom of countable choice. Using an idea from homotopy type theory - a higher inductive-inductive type - we construct a partiality monad without relying on countable choice. We prove that, in the presence of countable choice, our partiality monad is equivalent to the delay monad quotiented by weak bisimilarity. Furthermore, we outline several applications.
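    The construction starts from Capretta's delay monad. A minimal Haskell sketch of that datatype and its monad structure (a standard presentation, not code from the paper); strong bisimilarity distinguishes Now a from Later (Now a), while the weak bisimilarity used for the quotient identifies them:

        {-# LANGUAGE DeriveFunctor #-}

        -- Capretta's delay datatype: a value available now, or one more step of delay.
        -- Thanks to laziness the type is coinductive in Haskell, so 'never' is a
        -- legal, non-terminating inhabitant.
        data Delay a = Now a | Later (Delay a)
          deriving Functor

        instance Applicative Delay where
          pure          = Now
          Now f   <*> d = fmap f d
          Later f <*> d = Later (f <*> d)

        instance Monad Delay where
          Now a   >>= k = k a
          Later d >>= k = Later (d >>= k)

        -- The diverging computation.
        never :: Delay a
        never = Later never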

    Is Venice an ideal habitat for Legionella pneumophila?

    Introduction. The Legionella bacterium manifests itself in Legionnaire's disease and Pontiac fever; it is mainly found in, and transmitted by, aerosol produced in cooling towers, water distribution plants and medical equipment, and it mainly affects elderly persons in poor health. Methods. The population of the Venice Local Health Unit was divided into two areas of study, and the incidence of legionellosis in residents of the Venice historical centre (Distretto Sanitario 1) and in residents of the mainland and coastal areas (Distretti Sanitari 2, 3, 4) was calculated. The cases were those notified to the Public Health Unit as required by law, and the population of residents was that of those eligible for health care in the archives of the Local Health Unit. Only cases of legionellosis in residents who had not travelled in the 10 days preceding the onset of disease, and not related to nosocomial clusters, were considered. The standardized incidence ratio was then calculated and confidence intervals were defined by the Poisson distribution. Results. Given the populations of the two areas, 59801 in Distretto Sanitario 1 and 237555 in Distretti 2, 3, 4, the raw incidence of disease is respectively 87 per 100000 and 20 per 100000 over the period 2002-2010. The standardized incidence ratio for the population of Distretto Sanitario 1 vs the remaining population is 4.3. Discussion. The difference in the risk of contracting the disease in these two residential areas, which are geographically very close, is probably related to the different characteristics of the buildings, which are old and difficult to maintain in the Venice historical centre.
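    As a quick consistency check on the figures quoted above (my arithmetic, not the paper's), the crude rate ratio between the two areas is already close to the reported standardized incidence ratio:

        \[
          \frac{87 / 100\,000}{20 / 100\,000} \;\approx\; 4.4
          \quad\text{(crude rate ratio)}, \qquad
          \text{reported SIR} \;=\; 4.3 .
        \]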

    A coalgebraic view of bar recursion and bar induction

    We reformulate the bar recursion and induction principles in terms of recursive and wellfounded coalgebras. Bar induction was originally proposed by Brouwer as an axiom to recover certain classically valid theorems in a constructive setting. It is a form of induction on non-wellfounded trees satisfying certain properties. Bar recursion, introduced later by Spector, is the corresponding function definition principle. We give a generalization of these principles, by introducing the notion of barred coalgebra: a process with a branching behaviour given by a functor, such that all possible computations terminate. Coalgebraic bar recursion is the statement that every barred coalgebra is recursive; a recursive coalgebra is one that allows definition of functions by a coalgebra-to-algebra morphism. It is a framework to characterize valid forms of recursion for terminating functional programs. One application of the principle is the tabulation of continuous functions: Ghani, Hancock and Pattinson defined a type of wellfounded trees that represent continuous functions on streams. Bar recursion allows us to prove that every stably continuous function can be tabulated to such a tree, where by stability we mean that the modulus of continuity is also continuous. Coalgebraic bar induction states that every barred coalgebra is wellfounded; a wellfounded coalgebra is one that admits proof by induction.
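    The coalgebra-to-algebra morphisms mentioned above can be sketched in Haskell as the usual hylomorphism scheme; this is my illustration of the idea, with the caveat that Haskell cannot express the guarantee that a recursive (or barred) coalgebra makes the recursion terminate:

        {-# LANGUAGE DeriveFunctor #-}

        -- A coalgebra unfolds a seed by one step; an algebra folds one layer back down.
        type Coalgebra f a = a -> f a
        type Algebra   f b = f b -> b

        -- The coalgebra-to-algebra morphism: unfold with c, then fold with alg.
        -- For a recursive coalgebra this is well defined and unique; in Haskell
        -- termination is the programmer's obligation.
        hylo :: Functor f => Algebra f b -> Coalgebra f a -> a -> b
        hylo alg c = go
          where go = alg . fmap go . c

        -- A finitely branching functor, in the spirit of the "barred" processes above.
        data F x = Leaf Int | Branch [x]
          deriving Functor

        -- Example: split a number until the pieces are small, then sum the leaves.
        sumLeaves :: Int -> Int
        sumLeaves = hylo alg split
          where
            split n | n <= 1    = Leaf n
                    | otherwise = Branch [n `div` 2, n - n `div` 2]
            alg (Leaf i)    = i
            alg (Branch xs) = sum xs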

    From coinductive proofs to exact real arithmetic: theory and applications

    Based on a new coinductive characterization of continuous functions, we extract certified programs for exact real number computation from constructive proofs. The extracted programs construct and combine exact real number algorithms with respect to the binary signed digit representation of real numbers. The data type corresponding to the coinductive definition of continuous functions consists of finitely branching non-wellfounded trees describing when the algorithm writes and reads digits. We discuss several examples, including the extraction of programs for polynomials up to degree two and the definite integral of continuous maps.
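    The tree type described above can be sketched in Haskell roughly as follows; the names and the worked example are mine, chosen only to illustrate the read/write structure over signed binary digits:

        -- Signed binary digits -1, 0, +1; a stream of them represents a real in [-1, 1]
        -- via x = sum over i of d_i * 2^(-i).
        data Digit = N | Z | P                      -- -1, 0, +1
        data SDigits = Cons Digit SDigits

        -- A finitely branching, non-wellfounded tree describing a stream transformer:
        -- either write an output digit now, or read one input digit and branch on it.
        data Transducer
          = Write Digit Transducer
          | Read Transducer Transducer Transducer  -- branches for input -1, 0, +1

        -- Running a transducer against an input stream yields the output stream.
        run :: Transducer -> SDigits -> SDigits
        run (Write d t)     xs          = Cons d (run t xs)
        run (Read tn tz tp) (Cons d xs) = case d of
          N -> run tn xs
          Z -> run tz xs
          P -> run tp xs

        -- Example: negation (x |-> -x), digit by digit.
        negateT :: Transducer
        negateT = Read (Write P negateT) (Write Z negateT) (Write N negateT)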

    Termination Casts: A Flexible Approach to Termination with General Recursion

    This paper proposes a type-and-effect system called Teqt, which distinguishes terminating terms and total functions from possibly diverging terms and partial functions, for a lambda calculus with general recursion and equality types. The central idea is to include a primitive type-form "Terminates t", expressing that term t is terminating, and then to allow terms t to be coerced from possibly diverging to total, using a proof of Terminates t. We call such coercions termination casts, and show how to implement terminating recursion using them. For the meta-theory of the system, we describe a translation from Teqt to a logical theory of termination for general recursive, simply typed functions. Every typing judgment of Teqt is translated to a theorem expressing the appropriate termination property of the computational part of the Teqt term. (Comment: In Proceedings PAR 2010, arXiv:1012.455)
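    Teqt itself is a dependently typed calculus, which Haskell cannot faithfully express; the following is only a loose analogue of the termination-cast idea, with all names mine, where the "proof" is reduced to a trusted token:

        {-# LANGUAGE DataKinds, KindSignatures #-}

        -- Tag computations with an effect index: possibly diverging, or total.
        data Effect = MayDiverge | Total

        -- A computation producing an 'a', classified by its effect.
        newtype Comp (e :: Effect) a = Comp { runComp :: a }

        -- Stand-in for a proof object "Terminates t"; in Teqt this would be a
        -- genuine proof, here it is merely a trusted witness.
        data Terminates = TrustMeTerminates

        -- The termination cast: given a termination witness, reclassify a
        -- possibly diverging computation as total.
        terminationCast :: Terminates -> Comp 'MayDiverge a -> Comp 'Total a
        terminationCast TrustMeTerminates (Comp x) = Comp x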

    Formalization of Transform Methods using HOL Light

    Transform methods, like Laplace and Fourier, are frequently used for analyzing the dynamical behaviour of engineering and physical systems, based on their transfer function and frequency response, or on the solutions of their corresponding differential equations. In this paper, we present an ongoing project, which focuses on the higher-order logic formalization of transform methods using the HOL Light theorem prover. In particular, we present the motivation for the formalization, which is followed by the related work. Next, we present the tasks completed so far, while highlighting some of the challenges faced during the formalization. Finally, we present a roadmap to achieve our objectives, the current status and the future goals for this project. (Comment: 15 Pages, CICM 201)
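    For reference, the two transforms named above are the standard integral operators (textbook definitions in one common convention, not anything specific to the formalization described here):

        \[
          \mathcal{L}\{f\}(s) \;=\; \int_{0}^{\infty} f(t)\, e^{-st}\, dt,
          \qquad
          \mathcal{F}\{f\}(\omega) \;=\; \int_{-\infty}^{\infty} f(t)\, e^{-i\omega t}\, dt .
        \]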

    Generic point-free lenses

    Lenses are one of the most popular approaches to define bidirectional transformations between data models. A bidirectional transformation with view-update, denoted a lens, encompasses the definition of a forward transformation projecting concrete models into abstract views, together with a backward transformation instructing how to translate an abstract view to an update over concrete models. In this paper we show that most of the standard point-free combinators can be lifted to lenses with suitable backward semantics, allowing us to use the point-free style to define powerful bidirectional transformations by composition. We also demonstrate how to define generic lenses over arbitrary inductive data types by lifting standard recursion patterns, like folds or unfolds. To exemplify the power of this approach, we “lensify” some standard functions over naturals and lists, which are tricky to define directly “by-hand” using explicit recursion.
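    A minimal sketch of the lens interface and lens composition referred to above, in the usual get/put presentation (the names are mine and not necessarily the paper's combinators):

        -- A lens from a concrete source type s to an abstract view type v:
        -- 'get' is the forward transformation, 'put' translates an updated view
        -- back into an update of the original source.
        data Lens s v = Lens
          { get :: s -> v
          , put :: v -> s -> s
          }

        -- Point-free flavour: lenses compose like functions.
        compose :: Lens v u -> Lens s v -> Lens s u
        compose outer inner = Lens
          { get = get outer . get inner
          , put = \u s -> put inner (put outer u (get inner s)) s
          }

        -- Example: a lens onto the first component of a pair.
        fstL :: Lens (a, b) a
        fstL = Lens { get = fst, put = \a (_, b) -> (a, b) }

    A well-behaved lens is additionally expected to satisfy the usual round-trip laws (getting after putting returns the view; putting back an unchanged view leaves the source unchanged).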

    Quotienting the Delay Monad by Weak Bisimilarity

    The delay datatype was introduced by Capretta as a means to deal with partial functions (as in computability theory) in Martin-Löf type theory. It is a monad and it constitutes a constructive alternative to the maybe monad. It is often desirable to consider two delayed computations equal if, whenever one of them terminates, they both terminate with equal values. The equivalence relation underlying this identification is called weak bisimilarity. In type theory, one commonly replaces quotients with setoids. In this approach, the delay monad quotiented by weak bisimilarity is still a monad. In this paper, we consider Hofmann's alternative approach of extending type theory with inductive-like quotient types. In this setting, it is difficult to define the intended monad multiplication for the quotiented datatype. We give a solution where we postulate some principles, crucially proposition extensionality and the (semi-classical) axiom of countable choice. We have fully formalized our results in the Agda dependently typed programming language.
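    Weak bisimilarity can be phrased operationally in terms of convergence; a rough Haskell sketch (mine, and necessarily fuel-bounded, since the relation itself is not decidable):

        -- The delay datatype: a value now, or one more step of delay.
        data Delay a = Now a | Later (Delay a)

        -- d converges to v when finitely many Later steps reach Now v.
        -- Weak bisimilarity relates d1 and d2 exactly when they converge to the
        -- same values; in particular it identifies all diverging computations.
        -- In Haskell we can only check convergence up to a given amount of fuel.
        convergesTo :: Eq a => Int -> Delay a -> a -> Bool
        convergesTo _ (Now x)   v         = x == v
        convergesTo n (Later d) v | n > 0 = convergesTo (n - 1) d v
        convergesTo _ _         _         = False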

    The coinductive formulation of common knowledge

    We study the coinductive formulation of common knowledge in type theory. We formalise both the traditional relational semantics and an operator semantics, similar in form to the epistemic system S5, but at the level of events on possible worlds rather than as a logical derivation system. We have two major new results. Firstly, the operator semantics is equivalent to the relational semantics: we discovered that this requires a new hypothesis of semantic entailment on operators, not known in the previous literature. Secondly, the coinductive version of common knowledge is equivalent to the traditional transitive closure on the relational interpretation. All results are formalised in the proof assistants Agda and Coq.
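    For a finite model, the relational semantics described above can be sketched directly: an event is a set of possible worlds, each agent has an accessibility relation, "everybody knows" intersects the individual knowledge operators, and common knowledge is the greatest fixpoint obtained by iterating it (matching the transitive-closure reading mentioned above). A small Haskell illustration under these assumptions, with all names mine:

        import qualified Data.Set as Set
        import Data.Set (Set)

        type World  = Int
        type Event  = Set World              -- an event is a set of worlds
        type Access = World -> Set World     -- accessibility relation of one agent

        -- An agent knows event e at world w iff e holds at every world the agent
        -- considers possible from w.
        knows :: Access -> Event -> Event -> Event
        knows acc worlds e = Set.filter (\w -> acc w `Set.isSubsetOf` e) worlds

        -- "Everybody knows": intersection over all agents.
        everybodyKnows :: [Access] -> Event -> Event -> Event
        everybodyKnows agents worlds e =
          foldr Set.intersection worlds [ knows a worlds e | a <- agents ]

        -- Common knowledge as the greatest fixpoint of  x |-> E(e /\ x),
        -- computed by downward iteration (terminates on a finite set of worlds).
        commonKnowledge :: [Access] -> Event -> Event -> Event
        commonKnowledge agents worlds e = go worlds
          where
            go x
              | x' == x   = x
              | otherwise = go x'
              where x' = everybodyKnows agents worlds (e `Set.intersection` x)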